Search results for "Motion recognition"

Showing 10 of 29 documents

Affective matching of odors and facial expressions in infants: shifting patterns between 3 and 7 months.

2016

Recognition of emotional facial expressions is a crucial skill for adaptive behavior. Past research suggests that at 5 to 7 months of age, infants look longer at an unfamiliar dynamic angry/happy face that emotionally matches a vocal expression. This suggests that they can match stimuli from distinct modalities on their emotional content. In the present study, olfaction-vision matching abilities were assessed across different age groups (3, 5 and 7 months) using dynamic expressive faces (happy vs. disgusted) and distinct hedonic odor contexts (pleasant, unpleasant and control) in a visual-preference paradigm. At all ages, the infants were biased toward the disgusted faces. This visual bias…

Keywords: emotion recognition; facial expression; olfaction; odors; disgust; infants; 7-month-old infants; intermodal perception; imitation; autonomic responses; eye movement measurements; configural information; developmental psychology; cognitive neuroscience

Body Gestures and Spoken Sentences: A Novel Approach for Revealing User’s Emotions

2017

In the last decade, there has been a growing interest in emotion analysis research, which has been applied in several areas of computer science. Many authors have contributed to the development of emotion recognition algorithms, considering textual or non-verbal data as input, such as facial expressions, gestures or, in the case of multi-modal emotion recognition, a combination of them. In this paper, we describe a method to detect emotions from gestures, using the skeletal data obtained from Kinect-like devices as input, as well as a textual description of their meaning. The experimental results show that the correlation existing between body movements and spoken user sentence(s) can be u…

Keywords: emotion recognition; gesture recognition; sentiment analysis; non-verbal communication; facial expression; speech recognition; natural language processing; artificial intelligence
Source: 2017 IEEE 11th International Conference on Semantic Computing (ICSC)
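A minimal sketch of the kind of pipeline this entry describes, assuming 3-D joint positions from a Kinect-like sensor are already available as NumPy arrays; the per-joint movement-energy feature, the TF-IDF text feature, and the MLP classifier are illustrative stand-ins, not the authors' actual method.

```python
# Illustrative sketch only: movement-energy features from Kinect-like skeletal
# data, fused with a simple text feature, feeding a generic classifier.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier

def gesture_features(skeleton):
    """skeleton: array of shape (frames, joints, 3) with x/y/z joint positions."""
    velocity = np.diff(skeleton, axis=0)               # frame-to-frame displacement
    energy = np.linalg.norm(velocity, axis=2).mean(0)  # mean movement energy per joint
    return energy                                      # shape: (joints,)

# Hypothetical training data: 50 clips of 60 frames x 20 joints, plus a spoken sentence.
rng = np.random.default_rng(0)
clips = rng.normal(size=(50, 60, 20, 3))
sentences = ["I am so happy with this"] * 25 + ["this is really frustrating"] * 25
labels = ["joy"] * 25 + ["anger"] * 25

text_vec = TfidfVectorizer()
X_text = text_vec.fit_transform(sentences).toarray()
X_body = np.stack([gesture_features(c) for c in clips])
X = np.hstack([X_body, X_text])                        # simple early fusion of both modalities

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X, labels)
print(clf.predict(X[:2]))
```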

Overview of the Development of Hydraulic Above Knee Prosthesis

2017

This paper presents research and development of a hydraulically powered above-knee prosthesis (HAKP) and a novel prosthetic foot, intended to enable transfemoral (TF) amputees to perform stair ascent and other daily activities in as natural a manner as possible. Functions that require the exertion of large forces and moments during locomotion, such as walking up stairs and slopes, cannot be naturally accomplished by commercially available microprocessor-controlled above-knee (AK) prostheses. Such prosthetic devices are also expensive and unaffordable for a large part of the amputee population, so the most commonly used commercial prostheses are energetically passive devices. The deficiency of passive prosthetic devices is …

Keywords: motion recognition; above-knee prosthesis; kinematics; gait; stairs; stair ascent; physical medicine and rehabilitation

Testosterone and attention deficits as possible mechanisms underlying impaired emotion recognition in intimate partner violence perpetrators

2016

Several studies have reported impairments in decoding emotional facial expressions in intimate partner violence (IPV) perpetrators. However, the mechanisms that underlie these impaired skills are not well known. Given this gap in the literature, we aimed to establish whether IPV perpetrators (n = 18) differ from controls (n = 20) in their emotion decoding process, attentional skills, testosterone (T) and cortisol (C) levels, and T/C ratio, and to examine the moderating role of the group and hormonal parameters in the relationship between attention skills and the emotion decoding process. Our results demonstrated that IPV perpetrators showed poorer emotion recognition and …

Keywords: emotion recognition; facial expression; intimate partner violence; domestic violence; testosterone; cortisol; attention; attention deficits; attention switching; psychological interventions; clinical psychology; applied psychology
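For readers unfamiliar with moderation analysis, a hedged sketch of the kind of model the abstract mentions (an attention × group interaction predicting emotion recognition), run here on synthetic data; the variable names and the plain OLS specification are assumptions for illustration, not the study's actual analysis.

```python
# Illustrative moderation sketch on synthetic data; columns are placeholders,
# not the study's measures or results.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 38                                    # 18 perpetrators + 20 controls, as in the abstract
df = pd.DataFrame({
    "group": ["IPV"] * 18 + ["control"] * 20,
    "attention": rng.normal(size=n),
    "testosterone": rng.normal(size=n),
})
df["emotion_recognition"] = rng.normal(size=n)   # hypothetical decoding accuracy

# Do group membership and hormone level moderate the attention -> recognition link?
model = smf.ols("emotion_recognition ~ attention * group + attention * testosterone",
                data=df).fit()
print(model.summary())
```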

Facial emotion recognition in children and adolescents with specific learning disorder

2020

(1) Background: Some recent studies suggest that children and adolescents with different neurodevelopmental disorders perform worse in emotion recognition through facial expressions (ER) compared with typically developing peers. This impairment is also described in children with Specific Learning Disorders (SLD), compromising their scholastic achievement, social functioning, and quality of life. The purpose of our study is to evaluate ER skills in children and adolescents with SLD compared to a control group without learning disorders, and to correlate them with intelligence and executive functions. (2) Materials and Methods: Our work is a cross-sectional observational study. Sixty-three chil…

Keywords: facial emotion recognition; specific learning disorder; children; adolescents; executive functions; working memory; intelligence quotient; facial expression; anger; neuropsychology; observational study; clinical psychology

Examining facial emotion recognition as an intermediate phenotype for psychosis: Findings from the EUGEI study

2022

The EUGEI project was supported by the European Community’s Seventh Framework Program under grant agreement No. HEALTH-F2-2009-241909 (Project EU-GEI). Dr. Arango was supported by the Spanish Ministry of Science and Innovation; Instituto de Salud Carlos III (SAM16-PE07CP1, PI16/02012, PI19/024); CIBERSAM (...)

Keywords: psychosis; psychotic disorders; schizophrenia; schizotypal traits; emotion recognition; facial recognition; social cognition; intermediate phenotype; polygenic risk score; genetic risk; genomics; unaffected siblings; structured interview; regression analysis
Source: Progress in Neuro-Psychopharmacology and Biological Psychiatry

Emotion recognition from facial expressions: a normative study of the Ekman 60-Faces Test in the Italian population.

2013

The Ekman 60-Faces (EK-60F) Test is a well-known neuropsychological tool assessing emotion recognition from facial expressions. It is the most widely employed task for research purposes in psychiatric and neurological disorders, including neurodegenerative diseases such as the behavioral variant of Frontotemporal Dementia (bvFTD). Despite its remarkable usefulness in the social cognition research field, to date there are still no normative data for the Italian population, thus limiting its application in a clinical context. In this study, we report procedures and normative data for the Italian version of the test. One hundred and thirty-two healthy Italian participants aged between 20 and 79 years…

Keywords: emotion recognition; emotion classification; facial expression; Ekman 60-Faces Test; normative data; neuropsychological tests; social cognition; anger; disgust; sadness; visual pattern recognition; Italy
Source: Neurological Sciences (official journal of the Italian Neurological Society and of the Italian Society of Clinical Neurophysiology)

Speech Emotion Recognition method using time-stretching in the Preprocessing Phase and Artificial Neural Network Classifiers

2020

Human emotions play a significant role in the understanding of human behaviour. There are multiple ways of recognizing human emotions, one of which is through human speech. This paper presents an approach for designing a Speech Emotion Recognition (SER) system for an industrial training station. While assembling a product, the end user's emotions can be monitored and used as a parameter for adapting the training station. The proposed method uses a phase vocoder for time-stretching and an Artificial Neural Network (ANN) for the classification of five typical emotions. As input for the ANN classifier, features like Mel Frequency Cepstral Coefficients (MFCCs), short-te…

Keywords: emotion recognition; speech recognition; artificial neural network; phase vocoder; audio time-scale/pitch modification; Mel-frequency cepstrum; speech rate; preprocessing; classification
Source: 2020 IEEE 16th International Conference on Intelligent Computer Communication and Processing (ICCP)
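A minimal sketch of the preprocessing and classification chain described above, assuming librosa for the phase-vocoder time-stretch and the MFCC/energy features, and a small scikit-learn MLP as the ANN; the synthetic audio and all parameter choices are illustrative, not the paper's configuration.

```python
# Sketch: phase-vocoder time-stretching, MFCC + short-term energy features,
# and an ANN classifier, on synthetic stand-in audio.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

sr = 16000

def features(y):
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)  # spectral shape
    energy = librosa.feature.rms(y=y).mean()                         # short-term energy
    return np.append(mfcc, energy)

# Synthetic stand-in for recorded utterances of two emotion classes.
rng = np.random.default_rng(0)
X, labels = [], []
for label, freq in [("calm", 220.0), ("excited", 440.0)]:
    for _ in range(10):
        t = np.linspace(0, 1.0, sr, endpoint=False)
        y = np.sin(2 * np.pi * freq * t) + 0.05 * rng.normal(size=sr)
        y = librosa.effects.time_stretch(y, rate=rng.uniform(0.9, 1.1))  # phase-vocoder stretch
        X.append(features(y))
        labels.append(label)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(np.array(X), labels)
print(clf.predict(np.array(X)[:3]))
```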

Class discovery from semi-structured EEG data for affective computing and personalisation

2017

Many approaches to recognising emotions from metrical data such as EEG signals rely on identifying a very small number of classes and training a classifier. The interpretation of these classes varies from a single emotion such as stress [24] to features of an emotional model such as valence-arousal [4]. There are two major issues here. First, the classification approach limits the analysis of the data to the selected classes and is also highly dependent on training data/cycles, all of which limits generalisation. The second issue is that it does not exp…

Keywords: emotion recognition; affective computing; electroencephalography (EEG); DEAP; self-organizing feature maps; cluster analysis; feature extraction; machine learning; personalization; brain modeling
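A compact sketch of the class-discovery idea in this entry, under stated assumptions: synthetic signals stand in for DEAP-style EEG recordings, Welch band power is the feature, and k-means stands in for the unsupervised stage.

```python
# Sketch: band-power features from EEG-like epochs, then cluster to discover
# candidate affective classes instead of imposing them up front.
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

fs = 128                                     # DEAP's downsampled rate
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal):
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands.values()]

rng = np.random.default_rng(0)
epochs = rng.normal(size=(120, fs * 4))      # 120 hypothetical 4-second epochs
X = np.array([band_powers(e) for e in epochs])

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(clusters))                 # size of each discovered group
```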

Combining Supervised and Unsupervised Learning to Discover Emotional Classes

2017

Most previous work in emotion recognition has fixed the available classes in advance, and attempted to classify samples into one of these classes using a supervised learning approach. In this paper, we present preliminary work on combining supervised and unsupervised learning to discover potential latent classes which were not initially considered. To illustrate the potential of this hybrid approach, we have used a Self-Organizing Map (SOM) to organize a large number of Electroencephalogram (EEG) signals from subjects watching videos, according to their internal structure. Results suggest that a more useful labelling scheme could be produced by analysing the resulting topology in relation t…

Keywords: emotion recognition; affective computing; EEG; class discovery; supervised learning; unsupervised learning; cluster analysis; pattern recognition; user modelling; personalization; valence
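A short sketch of the hybrid approach outlined above: a Self-Organizing Map arranges unlabeled feature vectors by their internal structure, and the known labels are then projected onto the map to look for cells that hint at latent classes. MiniSom and the synthetic data are convenient assumptions for illustration, not the authors' implementation.

```python
# Sketch: unsupervised SOM topology learning, then a supervised pass that maps
# existing labels onto SOM cells to spot mixed cells (candidate latent classes).
import numpy as np
from collections import Counter
from minisom import MiniSom      # pip install minisom

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                     # hypothetical EEG-derived features
labels = rng.choice(["low_valence", "high_valence"], size=200)

som = MiniSom(6, 6, X.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(X, 1000)                         # unsupervised topology learning

# Supervised step: see which known labels land on each map cell.
cell_labels = {}
for x, y in zip(X, labels):
    cell_labels.setdefault(som.winner(x), Counter())[y] += 1

# Cells with heavily mixed labels hint at classes the original scheme missed.
for cell, counts in sorted(cell_labels.items()):
    print(cell, dict(counts))
```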